English → German

Wild West (proper noun)

  • (historical) The western United States during the 19th-century era of settlement, commonly believed to be lawless and unruly.
German translation: Wilder Westen
Source: Wiktionary